How To Fool Facial Recognition Systems
"We've developed a new attack on AI-driven facial recognition systems, which can change your photo in such a way that an AI system will recognise you as a different person, in fact as anyone you want," according to Adversa AI's official website. Adversa managed to trick the facial recognition search tool PimEyes into misidentifying Vice reporter Todd Feathers as Mark Zuckerberg. Facial recognition for one-to-one identification has become an increasingly popular AI application, but the technology is not fool-proof. Adversa's attack fools facial recognition algorithms by adding subtle alterations, or noise, to the original image.
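The core idea behind such attacks is that a perturbation can be small enough to be invisible to a human while still shifting how a model reads the image. Adversa has not published its method; the sketch below is only a toy illustration of the bounded-noise part, using random noise in place of a model-optimised perturbation:

```python
import numpy as np

rng = np.random.default_rng(0)

# A stand-in "photo": an 8-bit RGB image as a NumPy array.
photo = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)

def add_bounded_noise(img, epsilon=4):
    """Add a small, bounded perturbation to every pixel.

    Real attacks choose the perturbation direction by optimising
    against the target model; here random noise merely shows that
    each pixel moves by at most +/- epsilon intensity levels,
    which is imperceptible at 8-bit depth.
    """
    noise = rng.integers(-epsilon, epsilon + 1, size=img.shape)
    perturbed = np.clip(img.astype(int) + noise, 0, 255)
    return perturbed.astype(np.uint8)

cloaked = add_bounded_noise(photo)

# No pixel changes by more than epsilon, yet every pixel may differ.
max_change = np.abs(cloaked.astype(int) - photo.astype(int)).max()
```

In a real attack the noise is not random: it is chosen (typically via gradients of the target model) so that the perturbed image's face embedding lands near a different identity.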
Top 8 AI-Powered Privacy Tools To Fool Facial Recognition Systems
Facial recognition is one of the most controversial forms of AI. People, communities, and many activist groups have raised concerns about this technology because it jeopardises privacy and compromises an individual's security. While governments of different countries have placed mild to strict restrictions on such tech, there is still a great deal of scepticism and fear around it. This has given rise to the development of several AI-powered privacy tools that help 'fool' facial recognition technology. We have listed some of the most prominent ones. Developed at the University of Chicago's SAND Lab, Fawkes very subtly alters photographs at the pixel level to trick facial recognition systems.
Facebook trained AI to fool facial recognition systems, and it works on live video
Facebook remains embroiled in a multibillion-dollar lawsuit over its facial recognition practices, but that hasn't stopped its artificial intelligence research division from developing technology to combat the very misdeeds of which the company is accused. According to VentureBeat, Facebook AI Research (FAIR) has developed a state-of-the-art "de-identification" system that works on video, including live video. It alters key facial features of a video subject in real time using machine learning, tricking a facial recognition system into misidentifying the subject. De-identification technology has existed before, and entire companies, like Israeli AI and privacy firm D-ID, are dedicated to providing it for still images. There is also a whole category of facial-recognition-fooling imagery you can wear yourself, called adversarial examples, which work by exploiting weaknesses in how computer vision software has been trained to identify certain characteristics.
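Adversarial examples exploit the fact that trained models are locally linear enough that a tiny, deliberately chosen input change can flip their output. A minimal sketch of the gradient-sign idea, using a hypothetical linear "identity scorer" in place of a real face recognition network (all weights and inputs below are made up for illustration):

```python
import numpy as np

# Toy linear identity scorer: score = w . x + b.
# A positive score means "identity A", negative means "identity B".
w = np.array([0.5, -1.2, 0.8, 0.3])
b = 0.1

x = np.array([1.0, 0.2, 0.5, 0.9])  # stand-in face feature vector

def gradient_sign_perturb(x, w, epsilon):
    """Fast-gradient-sign-style step: nudge each feature by epsilon
    in the direction that lowers the score, pushing the input
    across the decision boundary with a minimal per-feature change."""
    grad = w  # gradient of (w . x + b) with respect to x
    return x - epsilon * np.sign(grad)

score_before = w @ x + b          # positive: recognised as identity A
x_adv = gradient_sign_perturb(x, w, epsilon=0.6)
score_after = w @ x_adv + b       # negative: misidentified as identity B
```

Against a deep face recognition model the gradient comes from backpropagation rather than being the weight vector itself, but the mechanism is the same: each input dimension moves only slightly, yet the prediction flips.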
'Face stealing' cap uses infrared to fool facial recognition systems
A baseball cap that can fool facial recognition systems into thinking you're someone else has been developed by scientists. The face-stealing hat projects infrared light, which is invisible to the naked eye but visible to AI camera systems, onto the wearer's face to trick those systems. Researchers said the technology can not only obscure your identity but also 'impersonate a different person to pass facial recognition-based authentication.' They added that the face-stealing lights could easily be 'hidden in an umbrella and possibly even hair or a wig.' Writing on the preprint server arXiv, the joint US and Chinese team, led by Dr Zhe Zhou of Fudan University in Shanghai, said: 'We propose a kind of brand new attack against face recognition systems, which is realised by illuminating the subject using infrared. 'Through launching this kind of attack, an attacker not only can dodge surveillance cameras.